Newsletter from the Desk of Confluent Developer,
In this edition of the Confluent Developer Newsletter, we introduce a shiny new section called “Know Your Developer” (KYD). KYD asks questions of our friendly Apache Kafka® and Apache Flink® committers, as well as developers in general, about everything that falls under the Data Streaming Platform umbrella.
We start with Robert Yokota, Staff Software Engineer II at Confluent.
Hi, I grew up in the San Francisco Bay Area, and have worked at a number of tech companies over the years, including Sybase, SGI, Sun Microsystems, IBM, Microsoft, and now Confluent. I’ve had the opportunity to work on a number of enterprise software products, but the data streaming space has been the most interesting so far.
When I started at Confluent, I initially joined the Kafka Connect team. During my early days, I was able to make a number of improvements to the Connect ecosystem, including KIP-297. Then I joined the Stream Governance team as one of its very first engineers. One of my first tasks was to add Protobuf and JSON Schema support to Schema Registry, which originally supported only Avro.
The goal of stream governance is to ensure the quality, security, and usability of streaming data. Stream governance has both organizational and technical aspects. From an organizational perspective, it encompasses how teams use policies and processes to govern data. From a technical perspective, it encompasses products and technologies that can assist with governance, such as Schema Registry, Data Contracts, Stream Catalog, and Stream Lineage.
Data Quality rules are one constituent part of a more general concept called a “Data Contract,” which augments a schema that can reside in Schema Registry. A Data Contract is a formal agreement between a producer and a consumer on the structure and semantics of streaming data. It comprises the following:
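As an illustrative sketch of the Data Quality rules mentioned above (the field name and rule name here are hypothetical, not from the newsletter), a rule attached to a schema in Schema Registry can be expressed as a CEL condition in the contract’s ruleSet. A fragment might look like:

```json
{
  "ruleSet": {
    "domainRules": [
      {
        "name": "checkSsnLen",
        "kind": "CONDITION",
        "type": "CEL",
        "mode": "WRITE",
        "expr": "size(message.ssn) == 9"
      }
    ]
  }
}
```

With a rule like this in place, a serializer configured against Schema Registry evaluates the condition on the producer side, so messages that violate it are rejected at the source rather than discovered downstream.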
We’ve recently released a new UI for Confluent Cloud that makes creating and using Data Contracts much easier. In the spirit of Shift Left, the rules in a Data Contract typically run at the source where the data is being produced.
As for Data Contract rules, we’re continuing to make them easier to use and more ubiquitous. We’re adding support for them across all of our client programming languages, including Java, C#, Go, Node.js, and Python.
In-Person:
Virtual:
Stay up to date with all Confluent-run meetup events by copying the following link into your personal calendar platform:
https://airtable.com/app8KVpxxlmhTbfcL/shrNiipDJkCa2GBW7/iCal
(Instructions for GCal, iCal, Outlook, etc.)
We hope you enjoyed our curated assortment of resources! If you’d like to provide feedback, suggest ideas for content you’d like to see, or you want to submit your own resource for consideration, email us at devx_newsletter@confluent.io!
If you’d like to view previous editions of the newsletter, visit our archive.
If you’re viewing this newsletter online, know that we appreciate your readership and that you can get this newsletter delivered directly to your inbox by filling out the sign-up form on the left-hand side.
P.S. If you want to learn more about Kafka, Flink, or Confluent Cloud, visit our developer site at Confluent Developer.
We will only share developer content and updates, including notifications when new content is added. We will never send you sales emails. 🙂 By subscribing, you understand we will process your personal information in accordance with our Privacy Statement.